A novel adaptive kernel method with kernel centers determined by a support vector regression approach

Abstract

The optimality of the kernel number and kernel centers plays a significant role in determining the approximation power of nearly all kernel methods. However, choosing optimal kernels is usually formulated as a global optimization task, which is hard to accomplish. Recently, an algorithm called improved recursive reduced least squares support vector regression (IRR-LSSVR) was proposed for establishing a global nonparametric offline model; it demonstrates a significant advantage over other methods in selecting fewer, more representative support vectors. Inspired by the IRR-LSSVR, a new adaptive parametric kernel method called WV-LSSVR is proposed in this paper, using the same type of kernels and the same centers as those used in the IRR-LSSVR. Furthermore, inspired by multikernel semiparametric support vector regression, the effect of extending the kernel is investigated in a recursive regression framework, and a recursive kernel method called GPK-LSSVR is proposed using a compound kernel of the type recommended for Gaussian process regression. Numerical experiments on benchmark data sets confirm the validity and effectiveness of the presented algorithms. The WV-LSSVR algorithm shows higher approximation accuracy than the recursive parametric kernel method whose centers are calculated by the k-means clustering approach. The extended recursive kernel method (i.e., GPK-LSSVR) shows no advantage in global approximation accuracy when the test data set is validated without real-time updating, but it can increase modeling accuracy when real-time identification is involved.
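The central contrast in the abstract is between kernel centers taken from the support vectors of an SVR fit and centers computed by k-means clustering. The sketch below illustrates that contrast on a toy one-dimensional problem using scikit-learn; it is not the paper's IRR-LSSVR, WV-LSSVR, or GPK-LSSVR. The epsilon-SVR settings, the Gaussian feature expansion, and the ridge-regularized least-squares fit (the hypothetical helpers rbf_features and fit_parametric_kernel_model) are illustrative assumptions only.

```python
# Minimal sketch, assuming scikit-learn: pick kernel centers either from the
# support vectors of an epsilon-SVR or from k-means, then fit a parametric
# Gaussian-kernel expansion by regularized least squares and compare.
import numpy as np
from sklearn.svm import SVR
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(200)

def rbf_features(X, centers, gamma=1.0):
    """Gaussian kernel expansion: Phi[i, j] = exp(-gamma * ||x_i - c_j||^2)."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def fit_parametric_kernel_model(X, y, centers, gamma=1.0, ridge=1e-3):
    """Ridge-regularized least-squares weights for a fixed set of kernel centers."""
    Phi = rbf_features(X, centers, gamma)
    A = Phi.T @ Phi + ridge * np.eye(Phi.shape[1])
    return np.linalg.solve(A, Phi.T @ y)

# Centers from the support vectors of an epsilon-SVR (data-adaptive choice).
svr = SVR(kernel="rbf", C=10.0, epsilon=0.05, gamma=1.0).fit(X, y)
sv_centers = X[svr.support_]

# Centers from k-means clustering (the baseline mentioned in the abstract),
# using the same number of centers for a like-for-like comparison.
km = KMeans(n_clusters=len(sv_centers), n_init=10, random_state=0).fit(X)
km_centers = km.cluster_centers_

for name, centers in [("SVR support vectors", sv_centers), ("k-means", km_centers)]:
    w = fit_parametric_kernel_model(X, y, centers)
    y_hat = rbf_features(X, centers) @ w
    rmse = np.sqrt(np.mean((y - y_hat) ** 2))
    print(f"{name}: {len(centers)} centers, training RMSE = {rmse:.4f}")
```

The batch least-squares solve above stands in for the recursive update that the abstract's methods use for real-time identification; the point of the sketch is only the choice of centers, not the recursion itself.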